Fix SSL Certificate Validation Behind Corporate Proxies #913
Conversation
When using LLM behind corporate proxies or firewalls that perform SSL inspection
(like Zscaler), HTTPS certificate validation can fail with connection errors.
This adds support for two environment variables:
- LLM_SSL_CONFIG: Configure SSL verification behavior ('native_tls' or 'no_verify')
- LLM_CA_BUNDLE: Path to a custom CA certificate bundle file
The implementation adds a helper function that configures the OpenAI client's
HTTP transport based on these settings.
Fixes simonw#772
```python
    return input_dict


def _configure_ssl_client(model_id):
```
This function doesn't seem to be using the model_id argument for anything...
```python
if "http_client" not in kwargs:
    kwargs["http_client"] = logging_client()
```
Doesn't this mean that LLM_OPENAI_SHOW_RESPONSES won't work if a custom SSL configuration is in use?
```python
try:
    if ssl_config == "native_tls":
        # Use the system's native certificate store
        return DefaultHttpxClient(transport=httpx.HTTPTransport(verify=True))
```
I don't think this is enough to use the system's native certificate store. Have a look at encode/httpx#2490. The problem with the proposed solution, using Truststore, is that it requires Python 3.10 or later.
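The truststore-based approach referenced in encode/httpx#2490 might look roughly like this. A sketch under stated assumptions: `truststore` is a third-party PyPI package requiring Python 3.10+, and the fallback branch is an assumption, not part of the PR:

```python
import ssl
import sys


def native_tls_context():
    """Build an SSL context backed by the operating system's trust store.

    Uses truststore (PyPI) on Python 3.10+, as suggested in
    encode/httpx#2490; falls back to Python's default context when
    truststore is unavailable. Hypothetical sketch.
    """
    if sys.version_info >= (3, 10):
        try:
            import truststore
            # truststore.SSLContext verifies against the OS certificate
            # store (e.g. one containing a corporate Zscaler root CA).
            return truststore.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
        except ImportError:
            pass
    return ssl.create_default_context()
```

The resulting context could be passed as `httpx.Client(verify=native_tls_context())`.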
Is there an update on this? It's not clear whether the original PR actually works to enable native TLS...
I made a draft PR that I am using daily at work, with the only caveat that it needs Python 3.10+.
Sorry, I got caught up in something and didn't circle back. My DAYJOB finally got back to me about Zscaler - they're updating the Root CA in the next ~2 weeks, so I won't need this soon. Going to close it; let's use @bissonex's version if still relevant.
This PR adds support for configuring SSL certificate handling when using LLM behind corporate proxies like Zscaler.
Problem
As described in issue #772, users behind corporate proxies or firewalls that perform SSL inspection (like Zscaler) encounter connection errors because the HTTPS certificate validation fails.
Unlike tools like `uv` (which has a `--native-tls` option), LLM didn't have a way to configure certificate handling to work in these environments.

Solution
This PR adds environment variables to configure SSL certificate handling:
The configuration options include:
- `LLM_SSL_CONFIG=native_tls`: Use the system's native certificate store
- `LLM_SSL_CONFIG=no_verify`: Disable certificate verification (not recommended for production)
- `LLM_CA_BUNDLE=/path/to/cert.pem`: Use a custom CA bundle file

Implementation Details
- Adds a helper function `_configure_ssl_client` that reads environment variables for SSL configuration
- Wires the resulting HTTP client into the `get_client` method
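Intended usage would be along these lines (the bundle path is an example, not from the PR; the two mechanisms are alternatives, not meant to be combined):

```shell
# Option 1: trust the operating system's certificate store
export LLM_SSL_CONFIG=native_tls

# Option 2: point at a custom CA bundle, e.g. an exported corporate root CA
export LLM_CA_BUNDLE=/etc/ssl/certs/corp-root-ca.pem
```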